55 research outputs found

    An evaluation of the effect of terrain normalization on classification accuracy of Landsat ETM+ imagery

    More than 60% of land in New Zealand has been converted from native forest to residential areas, agriculture, or forest plantations. Settlers brought many species of plants and animals to New Zealand, and many native species were unable to defend themselves against these new predators, causing numerous extinctions. In light of this rapid decline in biodiversity, the New Zealand government has attempted to mitigate the destruction of endemic flora and fauna through both new environmental policies and intensive land management. Land management techniques include the restoration of developed land and the protection of remaining areas of native forest. Monitoring of restoration efforts is important to the government and the organizations responsible for this work. Change analysis of remotely sensed data is a powerful method for long-term monitoring of restoration areas, but the accuracy of maps created from such data may be limited by the significant terrain variation within many of the restoration areas. Landcare Research New Zealand has developed a topographic suppression algorithm that reduces the effects of topography. Landsat ETM+ imagery from November 2000 was processed with this algorithm to produce two images, an orthorectified image and a terrain-flattened image, of a 50-km by 60-km area near Wanganui, New Zealand. Using GLOBE reference data collected on the ground in September/October 2004 and additional reference data photointerpreted from aerial photography, thematic maps were created using unsupervised, supervised, and hybrid classification methods. The accuracy of the thematic maps was evaluated using error matrices and Kappa analysis, and the different image processing techniques were statistically compared. The topographic-flattening algorithm did not significantly improve map accuracy.
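The error-matrix and Kappa evaluation mentioned above can be sketched in a few lines. This is a minimal illustration with a hypothetical 3-class confusion matrix, not the study's actual data or code:

```python
def kappa_from_error_matrix(cm):
    """Overall accuracy and Cohen's kappa from a confusion (error) matrix.

    Rows are map (classified) classes, columns are reference classes.
    """
    n = sum(sum(row) for row in cm)
    # Observed agreement p_o: fraction of samples on the diagonal
    observed = sum(cm[i][i] for i in range(len(cm))) / n
    row_totals = [sum(row) for row in cm]
    col_totals = [sum(col) for col in zip(*cm)]
    # Chance agreement p_e from the marginal totals
    expected = sum(r * c for r, c in zip(row_totals, col_totals)) / n ** 2
    kappa = (observed - expected) / (1.0 - expected)
    return observed, kappa

# Hypothetical 3-class error matrix (e.g., forest / pasture / urban)
cm = [[50, 5, 2],
      [4, 40, 6],
      [1, 3, 45]]
acc, k = kappa_from_error_matrix(cm)
print(f"overall accuracy = {acc:.3f}, kappa = {k:.3f}")
```

Kappa discounts the agreement expected by chance, which is why it is commonly reported alongside overall accuracy when comparing classification approaches.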

    Effect of Computer-Assisted Cognitive Behavior Therapy vs Usual Care on Depression Among Adults in Primary Care: A Randomized Clinical Trial

    Importance Depression is a common disorder that may go untreated or receive suboptimal care in primary care settings. Computer-assisted cognitive behavior therapy (CCBT) has been proposed as a method for improving access to effective psychotherapy, reducing cost, and increasing the convenience and efficiency of treatment for depression. Objectives To evaluate whether clinician-supported CCBT is more effective than treatment as usual (TAU) in primary care patients with depression and to examine the feasibility and implementation of CCBT in a primary care population with substantial numbers of patients with low income, limited internet access, and low levels of educational attainment. Design, Setting, and Participants This randomized clinical trial included adult primary care patients from clinical practices at the University of Louisville who scored 10 or greater on the Patient Health Questionnaire–9 (PHQ-9) and were randomly assigned to CCBT or TAU for 12 weeks of active treatment. Follow-up assessments were conducted 3 and 6 months after treatment completion. Enrollment occurred from June 24, 2016, to May 13, 2019. The last follow-up assessment was conducted on January 30, 2020. Interventions CCBT included use of the 9-lesson computer program Good Days Ahead, along with as many as 12 weekly telephonic support sessions of approximately 20 minutes with a master’s level therapist, in addition to TAU, which consisted of the standard clinical management procedures at the primary care sites. TAU was uncontrolled, but use of antidepressants and psychotherapy other than CCBT was recorded. Main Outcomes and Measures The primary outcome measure (PHQ-9) and secondary outcome measures (Automatic Thoughts Questionnaire for negative cognitions, Generalized Anxiety Disorder–7, and the Satisfaction with Life Scale for quality of life) were administered at baseline, 12 weeks, and 3 and 6 months after treatment completion. 
Satisfaction with treatment was assessed with the Client Satisfaction Questionnaire–8. Results The sample of 175 patients was predominantly female (147 of 174 [84.5%]) and had a high proportion of individuals who identified as racial and ethnic minority groups (African American, 44 of 162 patients who reported [27.2%]; American Indian or Alaska Native, 2 [1.2%]; Hispanic, 4 [2.5%]; multiracial, 14 [8.6%]). An annual income of less than $30 000 was reported by 88 of 143 patients (61.5%). Overall, 95 patients (54.3%) were randomly assigned to CCBT and 80 (45.7%) to TAU. Dropout rates were 22.1% for CCBT (21 patients) and 30.0% for TAU (24 patients). An intent-to-treat analysis found that CCBT led to significantly greater improvement in PHQ-9 scores than TAU at posttreatment (mean difference, −2.5; 95% CI, −4.5 to −0.8; P = .005) and at the 3-month (mean difference, −2.3; 95% CI, −4.5 to −0.8; P = .006) and 6-month (mean difference, −3.2; 95% CI, −4.5 to −0.8; P = .007) follow-up points. Posttreatment response and remission rates were also significantly higher for CCBT (response, 58.4% [95% CI, 46.4%-70.4%]; remission, 27.3% [95% CI, 16.4%-38.2%]) than TAU (response, 33.1% [95% CI, 20.7%-45.5%]; remission, 12.0% [95% CI, 3.3%-20.7%]). Conclusions and Relevance In this randomized clinical trial, CCBT was found to have significantly greater effects on depressive symptoms than TAU in primary care patients with depression. Because the study population included people with lower income and lack of internet access who typically have been underrepresented or not included in earlier investigations of CCBT, the results suggest that this form of treatment can be acceptable and useful in diverse primary care settings. Additional studies with larger samples are needed to address implementation procedures that could enhance the effectiveness of CCBT and to examine potential factors associated with treatment outcome.
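As an illustration of the between-group comparison reported above, the sketch below computes a mean difference in posttreatment PHQ-9 scores with an approximate 95% CI. The scores are invented, and a large-sample normal approximation stands in for the trial's actual statistical model:

```python
from math import sqrt
from statistics import mean, stdev

def mean_diff_ci(group_a, group_b, z=1.96):
    """Difference in means (a - b) with an approximate 95% CI,
    using the normal approximation for large samples."""
    na, nb = len(group_a), len(group_b)
    diff = mean(group_a) - mean(group_b)
    # Standard error of the difference from the two sample variances
    se = sqrt(stdev(group_a) ** 2 / na + stdev(group_b) ** 2 / nb)
    return diff, (diff - z * se, diff + z * se)

# Invented posttreatment PHQ-9 scores (lower = less depressed)
ccbt = [6, 8, 5, 9, 7, 4, 10, 6, 8, 5]
tau = [10, 12, 9, 11, 8, 13, 10, 9, 12, 11]
diff, (lo, hi) = mean_diff_ci(ccbt, tau)
print(f"mean difference = {diff:.1f}, 95% CI ({lo:.1f}, {hi:.1f})")
```

A CI that excludes zero, as in the trial's reported intervals, indicates a statistically significant difference between the groups at the corresponding level.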

    Premise Selection for Mathematics by Corpus Analysis and Kernel Methods

    Smart premise selection is essential when using automated reasoning as a tool for large-theory formal proof development. A good method for premise selection in complex mathematical libraries is the application of machine learning to large corpora of proofs. This work develops learning-based premise selection in two ways. First, a newly available minimal dependency analysis of existing high-level formal mathematical proofs is used to build a large knowledge base of proof dependencies, providing precise data for ATP-based re-verification and for training premise selection algorithms. Second, a new machine learning algorithm for premise selection based on kernel methods is proposed and implemented. To evaluate the impact of both techniques, a benchmark consisting of 2078 large-theory mathematical problems is constructed, extending the older MPTP Challenge benchmark. The combined effect of the techniques results in a 50% improvement on the benchmark over the Vampire/SInE state-of-the-art system for automated reasoning in large theories.

    Time domains of the hypoxic ventilatory response in ectothermic vertebrates

    Over a decade has passed since Powell et al. (Respir Physiol 112:123–134, 1998) described and defined the time domains of the hypoxic ventilatory response (HVR) in adult mammals. These time domains, however, have yet to receive much attention in other vertebrate groups. The initial, acute HVR of fish, amphibians and reptiles serves to minimize the imbalance between oxygen supply and demand. If the hypoxia is sustained, a suite of secondary adjustments occurs, giving rise to a more long-term balance (acclimatization) that allows the behaviors of normal life. These secondary responses can change over time as a function of the nature of the stimulus (the pattern and intensity of the hypoxic exposure). To add to the complexity of this process, hypoxia can also lead to metabolic suppression (the hypoxic metabolic response), and the magnitude of this is also time dependent. Unlike the original review of Powell et al., which considered the HVR only in adult animals, we also consider relevant developmental time points where information is available. Finally, in amphibians and reptiles with incompletely divided hearts, the magnitude of the ventilatory response will be modulated by hypoxia-induced changes in intra-cardiac shunting that also improve the match between O2 supply and demand, and these too change in a time-dependent fashion. While the current literature on this topic is reviewed here, it is noted that this area has received little attention. We attempt to redefine time domains in a more 'holistic' fashion that better accommodates research on ectotherms. If we are to distinguish between the genetic, developmental and environmental influences underlying the various ventilatory responses to hypoxia, however, we must design future experiments with time domains in mind.

    Mechanical design of the optical modules intended for IceCube-Gen2

    IceCube-Gen2 is an expansion of the IceCube neutrino observatory at the South Pole that aims to increase the sensitivity to high-energy neutrinos by an order of magnitude. To this end, about 10,000 new optical modules will be installed, instrumenting a fiducial volume of about 8 km3. Two newly developed optical module types increase IceCube's current sensitivity per module by a factor of three by integrating 16 and 18 newly developed four-inch PMTs in specially designed 12.5-inch diameter pressure vessels. Both designs use conical silicone gel pads to optically couple the PMTs to the pressure vessel and increase photon collection efficiency. The outer portions of the gel pads are pre-cast onto each PMT prior to integration, while the interiors are filled and cast after the PMT assemblies are installed in the pressure vessel via a pushing mechanism. This paper presents both the mechanical design and the performance of prototype modules at high pressure (70 MPa) and low temperature (−40 °C), characteristic of the environment inside the South Pole ice.

    The next generation neutrino telescope: IceCube-Gen2

    The IceCube Neutrino Observatory, a cubic-kilometer-scale neutrino detector at the geographic South Pole, has reached a number of milestones in the field of neutrino astrophysics: the discovery of a high-energy astrophysical neutrino flux, the temporal and directional correlation of neutrinos with a flaring blazar, and the observation of steady neutrino emission from an active galaxy of Seyfert II type and from the Milky Way. The next-generation neutrino telescope, IceCube-Gen2, currently under development, will consist of three essential components: an array of about 10,000 optical sensors, embedded within approximately 8 cubic kilometers of ice, for detecting neutrinos with energies of TeV and above, with a sensitivity five times greater than that of IceCube; a surface array with scintillation panels and radio antennas targeting air showers; and buried radio antennas distributed over an area of more than 400 square kilometers to significantly enhance the sensitivity to neutrino sources at EeV energies and beyond. This contribution describes the design and status of IceCube-Gen2 and discusses the expected sensitivity from simulations of the optical, surface, and radio components.

    Sensitivity of IceCube-Gen2 to measure flavor composition of Astrophysical neutrinos

    The observation of an astrophysical neutrino flux in IceCube, together with the detector's capability to separate the different neutrino flavors, has allowed IceCube to constrain the flavor content of this flux. IceCube-Gen2 is the planned extension of the current IceCube detector, with about 8 times the current instrumented volume. In this work, we study the sensitivity of IceCube-Gen2 to the astrophysical neutrino flavor composition and investigate its tau neutrino identification capabilities. We apply the IceCube analysis to a simulated IceCube-Gen2 dataset that mimics the High Energy Starting Event (HESE) classification. Reconstructions are performed using sensors that have 3 times higher quantum efficiency and isotropic angular acceptance compared to the current IceCube optical modules. We present the projected sensitivity of 10 years of data for constraining the flavor ratio of the astrophysical neutrino flux at Earth with IceCube-Gen2.

    Estimating the coincidence rate between the optical and radio array of IceCube-Gen2

    The IceCube-Gen2 Neutrino Observatory is proposed to extend the all-flavour energy range of IceCube beyond PeV energies. It will comprise two key components: I) an enlarged 8 km3 in-ice optical Cherenkov array to measure the continuation of the IceCube astrophysical neutrino flux and improve IceCube's point source sensitivity above ∼100 TeV; and II) a very large in-ice radio array with a surface area of about 500 km2. Radio waves propagate through ice with a kilometer-long attenuation length, hence a sparse radio array allows us to instrument a huge volume of ice and achieve sufficient sensitivity to detect neutrinos with energies above tens of PeV. The different signal topologies for neutrino-induced events measured by the optical and in-ice radio detectors - the radio detector is mostly sensitive to the cascades produced in the neutrino interaction, while the optical detector can detect long-ranging muon and tau leptons with high accuracy - yield highly complementary information. When detected in coincidence, these signals will allow us to reconstruct the neutrino energy and arrival direction with high fidelity. Furthermore, if events are detected in coincidence at a sufficient rate, they represent a unique opportunity to study systematic uncertainties and to cross-calibrate both detector components.

    Direction reconstruction performance for IceCube-Gen2 Radio

    The IceCube-Gen2 facility will extend the energy range of IceCube to ultra-high energies. The key component for detecting neutrinos with energies above 10 PeV is a large array of in-ice radio detectors. In previous work, direction reconstruction algorithms using the forward-folding technique have been developed for both shallow (≲20 m) and deep in-ice detectors, and have also been successfully used to reconstruct cosmic rays with ARIANNA. Here, we focus on the reconstruction algorithm for the deep in-ice detector, which was recently introduced in the context of the Radio Neutrino Observatory in Greenland (RNO-G).